Gaussian Process Networks
Authors
Abstract
In this paper we address the problem of learning the structure of a Bayesian network in domains with continuous variables. This task requires a procedure for comparing different candidate structures. In the Bayesian framework, this is done by evaluating the marginal likelihood of the data given a candidate structure. This term can be computed in closed-form for standard parametric families (e.g., Gaussians), and can be approximated, at some computational cost, for some semi-parametric families (e.g., mixtures of Gaussians).
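As a concrete illustration of the scoring step described above, the sketch below compares two candidate parent sets for one child variable by their Gaussian-process log marginal likelihood, in the spirit of the paper's title. The squared-exponential kernel, the fixed hyperparameters, and the toy data are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of parent vectors."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal_likelihood(X_parents, y_child, noise=0.1):
    """log p(y | X) under a zero-mean GP prior: the local score of one
    child variable given one candidate set of parents."""
    n = len(y_child)
    K = rbf_kernel(X_parents, X_parents) + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_child))
    return (-0.5 * y_child @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * n * np.log(2 * np.pi))

# Toy comparison of two candidate parent sets for a child variable Z
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(100, 1)), rng.normal(size=(100, 1))
Z = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)   # Z actually depends on X only

score_X  = gp_log_marginal_likelihood(X, Z)                  # parents = {X}
score_XY = gp_log_marginal_likelihood(np.hstack([X, Y]), Z)  # parents = {X, Y}
print(score_X, score_XY)
```

Under this kind of local score, candidate structures are compared by summing the scores of each child given its parents, so the structure search only ever needs to re-evaluate the families it changes.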
Similar resources
Novel Radial Basis Function Neural Networks based on Probabilistic Evolutionary and Gaussian Mixture Model for Satellites Optimum Selection
In this study, two novel learning algorithms have been applied to a Radial Basis Function Neural Network (RBFNN) to approximate functions of high non-linear order. The Probabilistic Evolutionary (PE) and Gaussian Mixture Model (GMM) techniques are proposed to significantly reduce the error functions. The main idea concerns the various strategies for optimizing the procedure of Gradient ...
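One generic way to pair a GMM with an RBF network, sketched below under assumptions of my own (centers and widths taken from scikit-learn's GaussianMixture, output weights by regularized least squares), is to let the mixture components place the basis functions. This is only an illustration of that general idea, not the PE/GMM algorithms of the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm_rbf(X, y, n_centers=8, reg=1e-6):
    """Place RBF centers/widths with a GMM, then solve output weights linearly."""
    gm = GaussianMixture(n_components=n_centers, random_state=0).fit(X)
    centers = gm.means_                                                   # (k, d)
    widths = np.sqrt(np.trace(gm.covariances_, axis1=1, axis2=2) / X.shape[1])

    def design(Xq):
        d2 = ((Xq[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / widths**2)

    Phi = design(X)
    w = np.linalg.solve(Phi.T @ Phi + reg * np.eye(n_centers), Phi.T @ y)
    return lambda Xnew: design(Xnew) @ w

# Toy usage on a non-linear target
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.normal(size=200)
predict = fit_gmm_rbf(X, y)
print(np.abs(predict(X) - y).mean())
```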
A hybrid method to find cumulative distribution function of completion time of GERT networks
This paper proposes a hybrid method to find the cumulative distribution function (CDF) of the completion time of GERT-type networks (GTN) which have no loop and have only exclusive-or nodes. The proposed method is created by combining an analytical transformation with a Gaussian quadrature formula. Also the combined crude Monte Carlo simulation and combined conditional Monte Carlo simulation are developed a...
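The crude Monte Carlo component of such a hybrid can be illustrated on a toy exclusive-or network; the branch probability and the activity-time distributions below are made up for the example and are not taken from the paper.

```python
import numpy as np

def completion_time_cdf_mc(t_grid, n_samples=100_000, seed=0):
    """Crude Monte Carlo estimate of the completion-time CDF for a toy
    exclusive-or network: with prob. 0.6 take a single exponential activity,
    otherwise take two exponential activities in series."""
    rng = np.random.default_rng(seed)
    branch = rng.random(n_samples) < 0.6
    t_a = rng.exponential(2.0, n_samples)                                  # branch A
    t_b = rng.exponential(1.0, n_samples) + rng.exponential(1.5, n_samples)  # branch B
    T = np.where(branch, t_a, t_b)
    return np.array([(T <= t).mean() for t in t_grid])

print(completion_time_cdf_mc([1.0, 2.0, 4.0, 8.0]))
```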
Gaussian Processes as Neural Networks
An implementation of a Gaussian process prediction mechanism using biologically plausible processes in the framework of a neural network.
Gaussian Process Regression Networks
We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the nonparametric flexibility of Gaussian processes. This model accommodates input dependent signal and noise correlations between multiple response variables, input dependent length-scales and amplitudes, and heavy-tailed predictive dis...
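A GPRN can be pictured generatively: latent Gaussian-process functions are mixed by weights that are themselves Gaussian processes of the input, which is what produces input-dependent correlations between the outputs. The sketch below samples from such a prior on a 1-D grid; the kernel lengthscales, the numbers of outputs and latent functions, and the noise level are illustrative assumptions.

```python
import numpy as np

def sample_gp(rng, K, jitter=1e-8):
    """Draw one zero-mean GP sample on a fixed grid with covariance K."""
    L = np.linalg.cholesky(K + jitter * np.eye(len(K)))
    return L @ rng.normal(size=len(K))

x = np.linspace(0, 1, 200)
def rbf(ls):
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(0)
P, Q = 3, 2                     # P outputs, Q latent functions
K_f, K_w = rbf(0.1), rbf(0.4)   # weights vary more slowly than latents

f = np.stack([sample_gp(rng, K_f) for _ in range(Q)])          # (Q, N)
W = np.stack([[sample_gp(rng, K_w) for _ in range(Q)]
              for _ in range(P)])                               # (P, Q, N)

# y_p(x) = sum_q W_pq(x) f_q(x) + noise: input-dependent mixing of the latents
y = np.einsum('pqn,qn->pn', W, f) + 0.05 * rng.normal(size=(P, len(x)))
print(y.shape)   # (3, 200)
```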
Computation with Infinite Neural Networks (Technical Report NCRG/97/025, under review for Neural Computation)
For neural networks with a wide class of weight priors, it can be shown that in the limit of an infinite number of hidden units the prior over functions tends to a Gaussian process. In this paper analytic forms are derived for the covariance function of the Gaussian processes corresponding to networks with sigmoidal and Gaussian hidden units. This allows predictions to be made efficiently using ne...
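For erf (sigmoidal) hidden units with zero-mean Gaussian weight priors, the limiting covariance function takes an arcsine form; the sketch below evaluates it assuming an isotropic weight prior, with the bias and input-weight variances chosen arbitrarily for illustration.

```python
import numpy as np

def erf_network_kernel(X1, X2, sigma_b=1.0, sigma_w=3.0):
    """Covariance of the GP obtained in the limit of infinitely many erf
    hidden units with zero-mean Gaussian weight priors (bias variance
    sigma_b^2, input-weight variance sigma_w^2)."""
    def augment(X):
        return np.hstack([np.ones((len(X), 1)), X])            # prepend a bias input
    S = np.diag([sigma_b**2] + [sigma_w**2] * X1.shape[1])     # weight prior covariance
    A1, A2 = augment(X1), augment(X2)
    cross = 2.0 * A1 @ S @ A2.T
    diag1 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A1, S, A1)
    diag2 = 1.0 + 2.0 * np.einsum('ij,jk,ik->i', A2, S, A2)
    return (2.0 / np.pi) * np.arcsin(cross / np.sqrt(np.outer(diag1, diag2)))

X = np.linspace(-2, 2, 5)[:, None]
print(np.round(erf_network_kernel(X, X), 3))
```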
Implementing Gaussian process inference with neural networks.
Gaussian processes compare favourably with backpropagation neural networks as a tool for regression, and Bayesian neural networks have Gaussian process behaviour when the number of hidden neurons tends to infinity. We describe a simple recurrent neural network with connection weights trained by one-shot Hebbian learning. This network amounts to a dynamical system which relaxes to a stable state...
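The relaxation idea can be illustrated without the Hebbian details: a simple linear dynamical system whose stable state encodes the solution of (K + sigma^2 I) alpha = y converges to the quantities needed for the GP predictive mean. The sketch below uses plain gradient-flow updates on that quadratic objective and is only a stand-in for the specific recurrent network described in the paper; kernel, noise level, and data are illustrative.

```python
import numpy as np

def rbf(X1, X2, ls=0.5):
    return np.exp(-0.5 * ((X1[:, None] - X2[None, :]) / ls) ** 2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 30))
y = np.sin(X) + 0.1 * rng.normal(size=30)
Xs = np.linspace(-3, 3, 5)

M = rbf(X, X) + 0.1**2 * np.eye(30)        # K + sigma^2 I
eta = 1.0 / np.linalg.eigvalsh(M).max()    # step size that keeps the dynamics stable

# Linear relaxation dynamics: alpha converges to the fixed point M^{-1} y,
# so the read-out k_*^T alpha converges to the GP predictive mean.
alpha = np.zeros(30)
for _ in range(20_000):
    alpha += eta * (y - M @ alpha)

mean_relax = rbf(Xs, X) @ alpha
mean_exact = rbf(Xs, X) @ np.linalg.solve(M, y)
print(np.max(np.abs(mean_relax - mean_exact)))   # residual shrinks with more iterations
```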